Evolution and generalization of a single neurone: I. Single-layer perceptron as seven statistical classifiers

Author

  • Sarunas Raudys

Abstract

Unlike many other investigations on this topic, the present one considers training of the non-linear single-layer perceptron (SLP) as a process in which the weights of the perceptron increase and the sum-of-squares cost function changes gradually. During backpropagation training, the decision boundary of the SLP becomes identical or close to that of seven statistical classifiers: (1) the Euclidean distance classifier, (2) the regularized linear discriminant analysis, (3) the standard Fisher linear discriminant function, (4) the Fisher linear discriminant function with a pseudoinverse covariance matrix, (5) the generalized Fisher discriminant function, (6) the minimum empirical error classifier, and (7) the maximum margin classifier. In order to obtain a wider range of classifiers, five new complexity-control techniques are proposed: target value control, moving the learning-data centre to the origin of coordinates, zero weight initialization, use of an additional negative weight decay term called "anti-regularization", and use of an exponentially increasing learning step. Which particular type of classifier is obtained depends on the data, the cost function to be minimized, the optimization technique and its parameters, and the stopping criteria.
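Two of the complexity controls named in the abstract (zero weight initialization and moving the data centre to the origin) can be illustrated with a short sketch. On toy two-class Gaussian data (all names and parameters here are hypothetical, not from the paper), the very first batch-gradient step of an SLP started from zero weights on centred data points in the direction of the difference of class means, i.e. the Euclidean distance classifier direction that the abstract lists as the earliest stage of the evolution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two Gaussian classes in 2-D.
n = 100
X1 = rng.normal([2.0, 0.0], 1.0, size=(n, 2))
X2 = rng.normal([-2.0, 0.0], 1.0, size=(n, 2))
X = np.vstack([X1, X2])
t = np.hstack([np.ones(n), -np.ones(n)])  # targets +1 / -1

# Complexity controls from the abstract: move the data centre to the
# origin and initialize the weights to zero.
Xc = X - X.mean(axis=0)
w = np.zeros(2)
b = 0.0

# One batch gradient step on the sum-of-squares cost with tanh activation.
eta = 0.01
y = np.tanh(Xc @ w + b)
err = y - t
grad_w = (err * (1.0 - y**2)) @ Xc / len(t)
grad_b = ((err * (1.0 - y**2)).sum()) / len(t)
w -= eta * grad_w
b -= eta * grad_b

# With w = 0 the activation is linear at the origin, so the first step
# makes w proportional to the difference of the class means -- the
# Euclidean distance classifier direction.
mean_diff = X1.mean(axis=0) - X2.mean(axis=0)
cos = w @ mean_diff / (np.linalg.norm(w) * np.linalg.norm(mean_diff))
print(round(cos, 3))  # → 1.0 (same direction)
```

Continuing training lets the weights grow and the tanh saturate, which is the mechanism by which the later, more complex classifiers in the list (up to minimum empirical error and maximum margin) emerge.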


Related articles

Application of ensemble learning techniques to model the atmospheric concentration of SO2

For pollution prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...


Multiple Classifiers System for Reducing Influences of Atypical Observations

Atypical observations, called outliers, are one of the difficulties in applying standard Gaussian-density-based pattern classification methods. A large number of outliers makes the distribution densities of input features multimodal. The problem becomes especially challenging in high-dimensional feature space. To tackle atypical observations, we propose multiple classifiers systems (MCSs) whose bas...


Exploring the behaviour of base classifiers in credit scoring ensembles

Many techniques have been proposed for credit risk assessment, from statistical models to artificial intelligence methods. During the last few years, different approaches to classifier ensembles have been applied successfully to credit scoring problems, proving to be more accurate than single prediction models. However, it remains an open question which base classifiers should be employed in ea...


Trainable fusion rules. II. Small sample-size effects

A thorough theoretical analysis is performed of the small-sample properties of trainable fusion rules, to determine in which situations neural network ensembles can improve or degrade classification results. We consider small-sample effects, specific to multiple classifiers system design, in the two-category case for two important fusion rules: (1) linear weighted average (weighted voting), realize...


Ensemble strategies to build neural network to facilitate decision making

There are three major strategies for forming neural network ensembles. The simplest is the cross-validation strategy, in which all members are trained with the same training data. Bagging and boosting strategies produce perturbed samples from the training data. This paper provides an ideal model based on two important factors: activation function and number of neurons in the hidden layer and based u...



Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume 11, Issue 2

Pages: -

Publication year: 1998